On Nesterov’s nonsmooth Chebyshev–Rosenbrock functions
Authors
Abstract
We discuss two nonsmooth functions on R^n introduced by Nesterov. We show that the first variant is partly smooth in the sense of Lewis and that its only stationary point is the global minimizer. In contrast, we show that the second variant has 2^(n−1) Clarke stationary points, none of which is a local minimizer except the global minimizer, but also that its only Mordukhovich stationary point is the global minimizer. Nonsmooth optimization algorithms started from multiple points generate iterates that approximate all 2^(n−1) Clarke stationary points, not only the global minimizer, but it remains an open question whether the nonminimizing Clarke stationary points are actually points of attraction for optimization algorithms. Published by Elsevier Ltd.
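The abstract does not reproduce the formulas, so the sketch below uses the forms of Nesterov's two nonsmooth Chebyshev–Rosenbrock variants as they are commonly stated in the literature; treat the exact expressions as an assumption. Both attain their global minimum of 0 at the all-ones vector, which the check below confirms numerically.

```python
import numpy as np

def nesterov_first(x):
    # First nonsmooth variant (assumed form):
    #   f(x) = (1/4)(x_1 - 1)^2 + sum_i |x_{i+1} - 2 x_i^2 + 1|
    # The inner absolute values make f nonsmooth but partly smooth.
    x = np.asarray(x, dtype=float)
    return 0.25 * (x[0] - 1.0) ** 2 + np.abs(x[1:] - 2.0 * x[:-1] ** 2 + 1.0).sum()

def nesterov_second(x):
    # Second nonsmooth variant (assumed form):
    #   f(x) = (1/4)|x_1 - 1| + sum_i |x_{i+1} - 2|x_i| + 1|
    # The nested absolute values create the many Clarke stationary points.
    x = np.asarray(x, dtype=float)
    return 0.25 * np.abs(x[0] - 1.0) + np.abs(x[1:] - 2.0 * np.abs(x[:-1]) + 1.0).sum()

# The all-ones vector satisfies x_{i+1} = 2 x_i^2 - 1 = 2|x_i| - 1 with x_1 = 1,
# so both functions evaluate to 0 there (their global minimum).
print(nesterov_first(np.ones(5)), nesterov_second(np.ones(5)))
```

Note that both functions are cheap to evaluate but, because the residuals enter through absolute values, neither is differentiable along the "valley" x_{i+1} = 2x_i^2 − 1 (respectively 2|x_i| − 1) that any descent method must follow.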
Similar articles
The exponential Rosenbrock-Euler method for nonsmooth initial data
We consider the exponential Rosenbrock-Euler method for the solution of nonlinear parabolic abstract ordinary differential equations. For smooth solutions the convergence analysis is known, and the method is of order two. Here we investigate the convergence of the method in the case of nonsmooth initial conditions which result in derivatives of the solution with singularities at the origin. We ...
On Sequential Optimality Conditions without Constraint Qualifications for Nonlinear Programming with Nonsmooth Convex Objective Functions
Sequential optimality conditions provide adequate theoretical tools to justify stopping criteria for nonlinear programming solvers. Here, nonsmooth approximate gradient projection and complementary approximate Karush-Kuhn-Tucker conditions are presented. These sequential optimality conditions are satisfied by local minimizers of optimization problems independently of the fulfillment of constrai...
A note about the complexity of minimizing Nesterov's smooth Chebyshev-Rosenbrock function
This short note considers and resolves the apparent contradiction between known worst-case complexity results for first and second-order methods for solving unconstrained smooth nonconvex optimization problems and a recent note by Jarre (2011) implying a very large lower bound on the number of iterations required to reach the solution’s neighbourhood for a specific problem with variable dimension.
Stochastic Coordinate Descent for Nonsmooth Convex Optimization
Stochastic coordinate descent, due to its practicality and efficiency, is increasingly popular in the machine learning and signal processing communities, as it has proven successful in several large-scale optimization problems such as ℓ1-regularized regression and support vector machines, to name a few. In this paper, we consider a composite problem where the nonsmoothness has a general structure that...
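For the ℓ1-regularized regression case mentioned above, one coordinate update has a closed form via soft-thresholding. The sketch below is a minimal randomized coordinate descent for the lasso objective (1/2)||Ax − b||^2 + λ||x||_1; it illustrates the general technique only and is not the algorithm of the paper, whose composite structure is more general.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|.|: shrink z toward zero by t.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def stochastic_cd_lasso(A, b, lam, iters=5000, seed=0):
    # Randomized coordinate descent for (1/2)||Ax - b||^2 + lam*||x||_1.
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)            # ||A_j||^2 for each column j
    r = b - A @ x                            # residual, maintained incrementally
    for _ in range(iters):
        j = rng.integers(n)                  # sample a coordinate uniformly
        rho = A[:, j] @ r + col_sq[j] * x[j] # A_j^T (b - A x + A_j x_j)
        x_new = soft_threshold(rho, lam) / col_sq[j]
        r += A[:, j] * (x[j] - x_new)        # keep residual consistent with x
        x[j] = x_new
    return x
```

Maintaining the residual r incrementally makes each update O(m) rather than O(mn), which is the usual reason coordinate descent scales to large problems.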
On error estimates for Galerkin spectral discretizations of parabolic problems with nonsmooth initial data
We analyze the Legendre and Chebyshev spectral Galerkin semidiscretizations of a one-dimensional homogeneous parabolic problem with nonconstant coefficients. We present error estimates for both smooth and nonsmooth data. In the Chebyshev case a limit on the order of approximation is established. By contrast, in the Legendre case we find an arbitrarily high order of convergence.